A reverse entropy power inequality for log-concave random vectors

Authors

  • Keith Ball
  • Piotr Nayar
  • Tomasz Tkocz
Abstract

We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vector defines a 1/5-seminorm. We make two conjectures concerning reverse entropy power inequalities in the log-concave setting and discuss some examples.

2010 Mathematics Subject Classification. Primary 94A17; Secondary 52A40, 60E15.
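For context (standard definitions not spelled out in the abstract itself), the entropy power and the classical entropy power inequality that the paper's "reverse" results refer to can be stated as follows:

```latex
% Differential entropy of a random vector X in R^n with density f:
h(X) = -\int_{\mathbb{R}^n} f(x)\,\log f(x)\,\mathrm{d}x .
% Its entropy power:
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n} .
% The classical entropy power inequality (Shannon--Stam),
% for independent random vectors X and Y in R^n:
N(X+Y) \;\ge\; N(X) + N(Y) ,
% with equality when X and Y are Gaussian with proportional covariances.
```

A reverse inequality bounds $N(X+Y)$ from above, which cannot hold in general but becomes plausible under structural assumptions such as log-concavity.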


Related articles

On the entropy power inequality for the Rényi entropy of order [0, 1]

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive a Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.
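As background for the order-$\alpha$ statement (a standard definition, not taken from the abstract itself), the Rényi entropy interpolates the Shannon differential entropy:

```latex
% Rényi entropy of order \alpha \in (0,1) for X with density f on R^n:
h_\alpha(X) = \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^n} f(x)^{\alpha}\,\mathrm{d}x ,
% which recovers the Shannon differential entropy h(X) in the limit \alpha \to 1.
```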


A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure |x− x̂|,...


Dimensional behaviour of entropy and information

We develop an information-theoretic perspective on some questions in convex geometry, providing for instance a new equipartition property for log-concave probability measures, some Gaussian comparison results for log-concave measures, an entropic formulation of the hyperplane conjecture, and a new reverse entropy power inequality for log-concave measures analogous to V. Milman’s reverse Brunn-M...


Wasserstein Stability of the Entropy Power Inequality for Log-Concave Densities

We establish quantitative stability results for the entropy power inequality (EPI). Specifically, we show that if uniformly log-concave densities nearly saturate the EPI, then they must be close to Gaussian densities in the quadratic Wasserstein distance. Further, if one of the densities is log-concave and the other is Gaussian, then the deficit in the EPI can be controlled in terms of the L-Wa...


Entropy Jumps for Radially Symmetric Random Vectors

We establish a quantitative bound on the entropy jump associated to the sum of independent, identically distributed (IID) radially symmetric random vectors having dimension greater than one. Following the usual approach, we first consider the analogous problem of Fisher information dissipation, and then integrate along the Ornstein-Uhlenbeck semigroup to obtain an entropic inequality. In a depa...




Publication date: 2015